Console Output
Training and evaluating model for: Freezer
Dataset length: 13782 windows
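The 13,782 windows are presumably produced by a sliding-window slicing of the input signal; neither the window length nor the stride appears in the log, so both are assumptions in this minimal sketch:

```python
def make_windows(series, window_size, stride=1):
    """Slice a 1-D reading series into fixed-length, possibly
    overlapping windows (window_size and stride are assumptions;
    the log only reports the resulting window count)."""
    return [series[i:i + window_size]
            for i in range(0, len(series) - window_size + 1, stride)]
```

With a stride smaller than the window size, consecutive windows overlap, which is a common way to inflate the effective dataset size for sequence models.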
NILMModel(
  (conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
  (lstm): LSTM(9, 128, num_layers=3, batch_first=True, dropout=0.1)
  (dropout): Dropout(p=0.1, inplace=False)
  (relu): ReLU()
  (output_layer): Linear(in_features=128, out_features=1, bias=True)
)
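The printed module summary fixes the layer shapes but not how they are wired together, so the forward pass below is a plausible reconstruction, not the original code: it assumes a seq2point layout (one power estimate per window, taken from the last LSTM timestep) with 9 input channels per timestep.

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """Reconstruction matching the printed module summary; the
    forward-pass wiring (ReLU after conv, last-timestep readout)
    is an assumption."""
    def __init__(self, in_channels=9, hidden=128):
        super().__init__()
        self.conv1d = nn.Conv1d(in_channels, in_channels,
                                kernel_size=3, stride=1, padding=1)
        self.lstm = nn.LSTM(in_channels, hidden, num_layers=3,
                            batch_first=True, dropout=0.1)
        self.dropout = nn.Dropout(p=0.1)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, channels, window_length); padding=1 keeps the length
        x = self.relu(self.conv1d(x))
        x = x.permute(0, 2, 1)             # (batch, window_length, channels)
        out, _ = self.lstm(x)
        out = self.dropout(out[:, -1, :])  # last timestep -> one value per window
        return self.output_layer(out)      # (batch, 1)
```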
Epoch [1/300], Train Loss: 0.006089
Validation Loss: 0.005949
Epoch [2/300], Train Loss: 0.005719
Validation Loss: 0.005285
Epoch [3/300], Train Loss: 0.003880
Validation Loss: 0.002576
Epoch [4/300], Train Loss: 0.002425
Validation Loss: 0.002193
Epoch [5/300], Train Loss: 0.002108
Validation Loss: 0.001947
Epoch [6/300], Train Loss: 0.001956
Validation Loss: 0.001842
Epoch [7/300], Train Loss: 0.001915
Validation Loss: 0.001784
Epoch [8/300], Train Loss: 0.001850
Validation Loss: 0.001736
Epoch [9/300], Train Loss: 0.001820
Validation Loss: 0.001696
Epoch [10/300], Train Loss: 0.001764
Validation Loss: 0.001667
Epoch [11/300], Train Loss: 0.001730
Validation Loss: 0.001633
Epoch [12/300], Train Loss: 0.001730
Validation Loss: 0.001784
Epoch [13/300], Train Loss: 0.001710
Validation Loss: 0.001645
Epoch [14/300], Train Loss: 0.001686
Validation Loss: 0.001579
Epoch [15/300], Train Loss: 0.001659
Validation Loss: 0.001690
Epoch [16/300], Train Loss: 0.001663
Validation Loss: 0.001603
Epoch [17/300], Train Loss: 0.001651
Validation Loss: 0.001564
Epoch [18/300], Train Loss: 0.001637
Validation Loss: 0.001556
Epoch [19/300], Train Loss: 0.001631
Validation Loss: 0.001591
Epoch [20/300], Train Loss: 0.001622
Validation Loss: 0.001590
Epoch [21/300], Train Loss: 0.001614
Validation Loss: 0.001569
Epoch [22/300], Train Loss: 0.001610
Validation Loss: 0.001531
Epoch [23/300], Train Loss: 0.001608
Validation Loss: 0.001543
Epoch [24/300], Train Loss: 0.001600
Validation Loss: 0.001517
Epoch [25/300], Train Loss: 0.001599
Validation Loss: 0.001523
Epoch [26/300], Train Loss: 0.001587
Validation Loss: 0.001506
Epoch [27/300], Train Loss: 0.001560
Validation Loss: 0.001506
Epoch [28/300], Train Loss: 0.001558
Validation Loss: 0.001492
Epoch [29/300], Train Loss: 0.001567
Validation Loss: 0.001529
Epoch [30/300], Train Loss: 0.001550
Validation Loss: 0.001479
Epoch [31/300], Train Loss: 0.001541
Validation Loss: 0.001484
Epoch [32/300], Train Loss: 0.001540
Validation Loss: 0.001472
Epoch [33/300], Train Loss: 0.001536
Validation Loss: 0.001567
Epoch [34/300], Train Loss: 0.001532
Validation Loss: 0.001525
Epoch [35/300], Train Loss: 0.001531
Validation Loss: 0.001476
Epoch [36/300], Train Loss: 0.001548
Validation Loss: 0.001518
Epoch [37/300], Train Loss: 0.001527
Validation Loss: 0.001466
Epoch [38/300], Train Loss: 0.001535
Validation Loss: 0.001464
Epoch [39/300], Train Loss: 0.001513
Validation Loss: 0.001465
Epoch [40/300], Train Loss: 0.001510
Validation Loss: 0.001454
Epoch [41/300], Train Loss: 0.001551
Validation Loss: 0.001762
Epoch [42/300], Train Loss: 0.001617
Validation Loss: 0.001483
Epoch [43/300], Train Loss: 0.001515
Validation Loss: 0.001459
Epoch [44/300], Train Loss: 0.001488
Validation Loss: 0.001436
Epoch [45/300], Train Loss: 0.001483
Validation Loss: 0.001469
Epoch [46/300], Train Loss: 0.001513
Validation Loss: 0.001482
Epoch [47/300], Train Loss: 0.001508
Validation Loss: 0.001718
Epoch [48/300], Train Loss: 0.001533
Validation Loss: 0.001434
Epoch [49/300], Train Loss: 0.001483
Validation Loss: 0.001436
Epoch [50/300], Train Loss: 0.001473
Validation Loss: 0.001460
Epoch [51/300], Train Loss: 0.001473
Validation Loss: 0.001491
Epoch [52/300], Train Loss: 0.001477
Validation Loss: 0.001425
Epoch [53/300], Train Loss: 0.001475
Validation Loss: 0.001419
Epoch [54/300], Train Loss: 0.001471
Validation Loss: 0.001553
Epoch [55/300], Train Loss: 0.001512
Validation Loss: 0.001435
Epoch [56/300], Train Loss: 0.001476
Validation Loss: 0.001440
Epoch [57/300], Train Loss: 0.001463
Validation Loss: 0.001417
Epoch [58/300], Train Loss: 0.001467
Validation Loss: 0.001418
Epoch [59/300], Train Loss: 0.001498
Validation Loss: 0.001458
Epoch [60/300], Train Loss: 0.001460
Validation Loss: 0.001434
Epoch [61/300], Train Loss: 0.001450
Validation Loss: 0.001420
Epoch [62/300], Train Loss: 0.001439
Validation Loss: 0.001408
Epoch [63/300], Train Loss: 0.001501
Validation Loss: 0.001428
Epoch [64/300], Train Loss: 0.001462
Validation Loss: 0.001421
Epoch [65/300], Train Loss: 0.001440
Validation Loss: 0.001404
Epoch [66/300], Train Loss: 0.001432
Validation Loss: 0.001396
Epoch [67/300], Train Loss: 0.001443
Validation Loss: 0.001457
Epoch [68/300], Train Loss: 0.001525
Validation Loss: 0.001594
Epoch [69/300], Train Loss: 0.001545
Validation Loss: 0.001459
Epoch [70/300], Train Loss: 0.001512
Validation Loss: 0.001447
Epoch [71/300], Train Loss: 0.001486
Validation Loss: 0.001440
Epoch [72/300], Train Loss: 0.001454
Validation Loss: 0.001430
Epoch [73/300], Train Loss: 0.001442
Validation Loss: 0.001426
Epoch [74/300], Train Loss: 0.001433
Validation Loss: 0.001443
Epoch [75/300], Train Loss: 0.001424
Validation Loss: 0.001395
Epoch [76/300], Train Loss: 0.001435
Validation Loss: 0.001403
Epoch [77/300], Train Loss: 0.001433
Validation Loss: 0.001400
Epoch [78/300], Train Loss: 0.001421
Validation Loss: 0.001390
Epoch [79/300], Train Loss: 0.001433
Validation Loss: 0.001386
Epoch [80/300], Train Loss: 0.001456
Validation Loss: 0.001400
Epoch [81/300], Train Loss: 0.001422
Validation Loss: 0.001388
Epoch [82/300], Train Loss: 0.001418
Validation Loss: 0.001432
Epoch [83/300], Train Loss: 0.001434
Validation Loss: 0.001379
Epoch [84/300], Train Loss: 0.001450
Validation Loss: 0.001418
Epoch [85/300], Train Loss: 0.001418
Validation Loss: 0.001383
Epoch [86/300], Train Loss: 0.001422
Validation Loss: 0.001405
Epoch [87/300], Train Loss: 0.001423
Validation Loss: 0.001378
Epoch [88/300], Train Loss: 0.001412
Validation Loss: 0.001394
Epoch [89/300], Train Loss: 0.001417
Validation Loss: 0.001372
Epoch [90/300], Train Loss: 0.001456
Validation Loss: 0.001423
Epoch [91/300], Train Loss: 0.001421
Validation Loss: 0.001402
Epoch [92/300], Train Loss: 0.001413
Validation Loss: 0.001398
Epoch [93/300], Train Loss: 0.001401
Validation Loss: 0.001397
Epoch [94/300], Train Loss: 0.001401
Validation Loss: 0.001404
Epoch [95/300], Train Loss: 0.001411
Validation Loss: 0.001363
Epoch [96/300], Train Loss: 0.001406
Validation Loss: 0.001379
Epoch [97/300], Train Loss: 0.001415
Validation Loss: 0.001369
Epoch [98/300], Train Loss: 0.001406
Validation Loss: 0.001454
Epoch [99/300], Train Loss: 0.001462
Validation Loss: 0.001458
Epoch [100/300], Train Loss: 0.001496
Validation Loss: 0.001415
Epoch [101/300], Train Loss: 0.001450
Validation Loss: 0.001390
Epoch [102/300], Train Loss: 0.001401
Validation Loss: 0.001369
Epoch [103/300], Train Loss: 0.001400
Validation Loss: 0.001372
Epoch [104/300], Train Loss: 0.001408
Validation Loss: 0.001364
Epoch [105/300], Train Loss: 0.001399
Validation Loss: 0.001355
Epoch [106/300], Train Loss: 0.001391
Validation Loss: 0.001379
Epoch [107/300], Train Loss: 0.001398
Validation Loss: 0.001357
Epoch [108/300], Train Loss: 0.001392
Validation Loss: 0.001359
Epoch [109/300], Train Loss: 0.001402
Validation Loss: 0.001355
Epoch [110/300], Train Loss: 0.001399
Validation Loss: 0.001363
Epoch [111/300], Train Loss: 0.001388
Validation Loss: 0.001376
Epoch [112/300], Train Loss: 0.001425
Validation Loss: 0.001357
Epoch [113/300], Train Loss: 0.001411
Validation Loss: 0.001358
Epoch [114/300], Train Loss: 0.001394
Validation Loss: 0.001355
Epoch [115/300], Train Loss: 0.001408
Validation Loss: 0.001348
Epoch [116/300], Train Loss: 0.001381
Validation Loss: 0.001347
Epoch [117/300], Train Loss: 0.001372
Validation Loss: 0.001356
Epoch [118/300], Train Loss: 0.001464
Validation Loss: 0.001415
Epoch [119/300], Train Loss: 0.001413
Validation Loss: 0.001390
Epoch [120/300], Train Loss: 0.001402
Validation Loss: 0.001364
Epoch [121/300], Train Loss: 0.001381
Validation Loss: 0.001362
Epoch [122/300], Train Loss: 0.001394
Validation Loss: 0.001347
Epoch [123/300], Train Loss: 0.001396
Validation Loss: 0.001369
Epoch [124/300], Train Loss: 0.001382
Validation Loss: 0.001390
Epoch [125/300], Train Loss: 0.001365
Validation Loss: 0.001342
Epoch [126/300], Train Loss: 0.001383
Validation Loss: 0.001346
Epoch [127/300], Train Loss: 0.001477
Validation Loss: 0.001426
Epoch [128/300], Train Loss: 0.001410
Validation Loss: 0.001351
Epoch [129/300], Train Loss: 0.001509
Validation Loss: 0.001401
Epoch [130/300], Train Loss: 0.001439
Validation Loss: 0.001394
Epoch [131/300], Train Loss: 0.001414
Validation Loss: 0.001365
Epoch [132/300], Train Loss: 0.001383
Validation Loss: 0.001345
Epoch [133/300], Train Loss: 0.001374
Validation Loss: 0.001346
Epoch [134/300], Train Loss: 0.001379
Validation Loss: 0.001338
Epoch [135/300], Train Loss: 0.001369
Validation Loss: 0.001335
Epoch [136/300], Train Loss: 0.001368
Validation Loss: 0.001347
Epoch [137/300], Train Loss: 0.001353
Validation Loss: 0.001340
Epoch [138/300], Train Loss: 0.001361
Validation Loss: 0.001336
Epoch [139/300], Train Loss: 0.001454
Validation Loss: 0.001398
Epoch [140/300], Train Loss: 0.001409
Validation Loss: 0.001365
Epoch [141/300], Train Loss: 0.001384
Validation Loss: 0.001334
Epoch [142/300], Train Loss: 0.001365
Validation Loss: 0.001343
Epoch [143/300], Train Loss: 0.001383
Validation Loss: 0.001332
Epoch [144/300], Train Loss: 0.001373
Validation Loss: 0.001327
Epoch [145/300], Train Loss: 0.001368
Validation Loss: 0.001355
Epoch [146/300], Train Loss: 0.001372
Validation Loss: 0.001330
Epoch [147/300], Train Loss: 0.001371
Validation Loss: 0.001327
Epoch [148/300], Train Loss: 0.001362
Validation Loss: 0.001325
Epoch [149/300], Train Loss: 0.001353
Validation Loss: 0.001325
Epoch [150/300], Train Loss: 0.001365
Validation Loss: 0.001335
Epoch [151/300], Train Loss: 0.001352
Validation Loss: 0.001326
Epoch [152/300], Train Loss: 0.001347
Validation Loss: 0.001323
Epoch [153/300], Train Loss: 0.001354
Validation Loss: 0.001327
Epoch [154/300], Train Loss: 0.001440
Validation Loss: 0.001501
Epoch [155/300], Train Loss: 0.001400
Validation Loss: 0.001329
Epoch [156/300], Train Loss: 0.001350
Validation Loss: 0.001324
Epoch [157/300], Train Loss: 0.001356
Validation Loss: 0.001328
Epoch [158/300], Train Loss: 0.001357
Validation Loss: 0.001348
Epoch [159/300], Train Loss: 0.001372
Validation Loss: 0.001454
Epoch [160/300], Train Loss: 0.001428
Validation Loss: 0.001349
Epoch [161/300], Train Loss: 0.001360
Validation Loss: 0.001335
Epoch [162/300], Train Loss: 0.001348
Validation Loss: 0.001329
Early stopping triggered
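The log's best validation loss (0.001323, epoch 152) is followed by ten non-improving epochs before the stop at epoch 162, which is consistent with patience-based early stopping; the patience value of 10 below is an inference from the log, not a stated setting:

```python
class EarlyStopping:
    """Minimal patience-based early stopping; the patience of 10
    is inferred from the log, not confirmed by it."""
    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Call once per epoch; returns True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss      # new best: reset the counter
            self.counter = 0
        else:
            self.counter += 1         # no improvement this epoch
        return self.counter >= self.patience
```

In the training loop this would wrap the per-epoch validation pass: `if stopper.step(val_loss): break`.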
Evaluating model for: Freezer
Validation MAE: 31.742741 W
Validation MSE: 2153.301025 W²
Validation RMSE: 46.403675 W
Signal Aggregate Error (SAE): 0.000840
Normalized Disaggregation Error (NDE): 0.347826
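The five reported metrics can be computed from per-sample predictions and ground truth; exact SAE/NDE definitions vary slightly across the NILM literature, so the formulas below are the common forms (total-energy relative error for SAE, squared error normalized by the true signal's energy for NDE), not necessarily the ones used here:

```python
import math

def nilm_metrics(y_true, y_pred):
    """Standard NILM evaluation metrics (typical literature forms;
    the original script's exact definitions are not shown in the log)."""
    n = len(y_true)
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    rmse = math.sqrt(mse)
    # SAE: relative error of the total predicted energy over the period
    sae = abs(sum(y_pred) - sum(y_true)) / sum(y_true)
    # NDE: squared error normalized by the true signal's energy
    nde = (sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
           / sum(t ** 2 for t in y_true))
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "SAE": sae, "NDE": nde}
```

Note the pattern visible in the results above: SAE is near zero (total energy is captured well) while NDE is around 0.35, meaning the per-timestep shape of the freezer signal is harder to match than its aggregate consumption.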
[Figure: Training and Validation Loss (interactive plot not rendered)]